V Identification Entropy
Author
Abstract
Shannon (1948) has shown that a source $(\mathcal{U}, P, U)$ with output $U$ satisfying $\mathrm{Prob}(U = u) = P_u$ can be encoded in a prefix code $\mathcal{C} = \{c_u : u \in \mathcal{U}\} \subset \{0,1\}^*$ such that for the entropy $H(P) = \sum_{u \in \mathcal{U}} -P_u \log P_u$ we have $H(P) \le \sum_{u \in \mathcal{U}} P_u \|c_u\| \le H(P) + 1$, where $\|c_u\|$ is the length of $c_u$. We use a prefix code $\mathcal{C}$ for another purpose, namely noiseless identification: every user who wants to know whether a $u$ ($u \in \mathcal{U}$) of his interest is the actual source output or not can consider the RV $C$ with $C = c_u = (c_{u1}, \ldots, c_{u\|c_u\|})$ and check whether $C = (C_1, C_2, \ldots)$ coincides with $c_u$ in the first, second, etc. letter, stopping when the first differing letter occurs or when $C = c_u$. Let $L_{\mathcal{C}}(P, u)$ be the expected number of checkings if code $\mathcal{C}$ is used. Our discovery is an identification entropy, namely the function $H_I(P) = 2\bigl(1 - \sum_{u \in \mathcal{U}} P_u^2\bigr)$. We prove that $L_{\mathcal{C}}(P, P) = \sum_{u \in \mathcal{U}} P_u L_{\mathcal{C}}(P, u) \ge H_I(P)$, and thus also that $L(P) = \min_{\mathcal{C}} \max_{u \in \mathcal{U}} L_{\mathcal{C}}(P, u) \ge H_I(P)$, together with related upper bounds, which demonstrates the operational significance of identification entropy in noiseless source coding, similar to the role Shannon entropy plays in noiseless data compression. Other averages such as $\bar{L}_{\mathcal{C}}(P) = \frac{1}{|\mathcal{U}|} \sum_{u \in \mathcal{U}} L_{\mathcal{C}}(P, u)$ are also discussed, in particular for Huffman codes, where classically equivalent Huffman codes may now differ. We also show that prefix codes whose codewords correspond to the leaves of a regular binary tree are universally good for this average.
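The quantities in the abstract can be made concrete with a small sketch. Assuming a toy uniform source on four symbols and a balanced prefix code (both chosen here for illustration, not taken from the paper), the code below computes $H_I(P)$ and the expected number of checkings $L_{\mathcal{C}}(P, u)$: a user interested in $u$ compares $c_u$ with the output codeword letter by letter, so the number of checks against output $v$ is the length of the common prefix plus one, capped at $\|c_u\|$.

```python
def identification_entropy(p):
    """H_I(P) = 2 * (1 - sum_u P_u^2)."""
    return 2.0 * (1.0 - sum(q * q for q in p.values()))

def num_checks(cu, cv):
    """Letters compared when testing codeword cu against actual output cv:
    one check per position until the first mismatch, or all of cu when cv == cu
    (prefix-freeness rules out cu being a proper prefix of cv)."""
    lcp = 0
    for a, b in zip(cu, cv):
        if a != b:
            break
        lcp += 1
    return min(lcp + 1, len(cu))

def expected_checks(p, code, u):
    """L_C(P, u): average checks over the source distribution P."""
    return sum(p[v] * num_checks(code[u], code[v]) for v in p)

# Hypothetical example: uniform P on 4 symbols, balanced prefix code.
p = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
code = {1: "00", 2: "01", 3: "10", 4: "11"}

h_i = identification_entropy(p)  # 2 * (1 - 4 * 1/16) = 1.5
l_avg = sum(p[u] * expected_checks(p, code, u) for u in p)
print(h_i, l_avg)  # prints 1.5 1.5
```

For this symmetric example the lower bound $L_{\mathcal{C}}(P, P) \ge H_I(P)$ holds with equality: each of the four users needs on average $\frac{1}{4}(2 + 2 + 1 + 1) = 1.5$ checks.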
Similar resources
The phase transition of corrected black hole with f(R) gravity
In this letter, we consider a static black hole in f(R) gravity. We take advantage of the corrected entropy and temperature to investigate such a black hole. Finally, we study the $ P - V $ criticality and phase transition of the corrected black hole with respect to entropy and temperature. We also obtain the heat capacity for the static black hole in $ f(R) $ gravity. This calculation helps us...
An Entropy-Like Estimator for Robust Parameter Identification
This paper describes the basic ideas behind a novel prediction error parameter identification algorithm exhibiting high robustness with respect to outlying data. Given the low sensitivity to outliers, these can be more easily identified by analysing the residuals of the fit. The devised cost function is inspired by the definition of entropy, although the method in itself does not exploit the st...
SHANNON ENTROPY IN ORDER STATISTICS AND THEIR CONCOMITANTS FROM BIVARIATE NORMAL DISTRIBUTION
In this paper, we first derive some results on the Shannon entropy of order statistics and their concomitants arising from a sequence $\{(X_i, Y_i): i = 1, 2, \ldots\}$ of independent and identically distributed (iid) random variables from the bivariate normal distribution, and extend our results to a collection $C(X, Y) = \{(X_{r_1:n}, Y_{[r_1:n]}), (X_{r_2:n}, Y_{[r_2:n]}), \ldots, (X_{r_k:n}, Y_{[r_k:n]})\}$ of order statistics and th...
An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems
This paper investigates error-entropy minimization in adaptive systems training. We prove the equivalence between minimization of the error's Rényi entropy of order $\alpha$ and minimization of a Csiszár distance measure between the densities of desired and system outputs. A nonparametric estimator for Rényi's entropy is presented, and it is shown that the global minimum of this estimator is the same as the...
Using Shannon Entropy as EEG Signal Feature for Fast Person Identification
Identification accuracy and speed are important factors in automatic person identification systems. In this paper, we propose a feature extraction method to extract brain wave features from different brain rhythms of electroencephalography (EEG) signal for the purpose of fast, yet accurate person identification. The proposed feature extraction method is based on the fact that EEG signal is comp...